- Title
- Distributed subgradient-free stochastic optimization algorithm for nonsmooth convex functions over time-varying networks
- Creator
- Wang, Yinghui; Zhao, Wenxiao; Hong, Yiguang; Zamani, Mohsen
- Relation
- SIAM Journal on Control and Optimization Vol. 57, Issue 4, p. 2821-2842
- Publisher Link
- http://dx.doi.org/10.1137/18M119046X
- Publisher
- Society for Industrial and Applied Mathematics (SIAM)
- Resource Type
- journal article
- Date
- 2019
- Description
- In this paper we consider a distributed stochastic optimization problem without gradient/subgradient information for the local objective functions and subject to local convex constraints. The objective functions may be nonsmooth and are observed with stochastic noise, and the network for the distributed design is time-varying. By adding stochastic dithers to the local objective functions and constructing randomized differences motivated by the Kiefer--Wolfowitz algorithm, we propose a distributed subgradient-free algorithm for finding the global minimizer with local observations. Moreover, we prove that consensus of the estimates and global minimization are achieved with probability one over the time-varying network, and we obtain the convergence rate of the mean average of the estimates as well. Finally, we give numerical examples to illustrate the performance of the proposed algorithms.
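The abstract's approach can be illustrated with a minimal sketch. The code below is not the authors' algorithm from the paper; it is a generic consensus-plus-descent scheme in the same spirit, assuming: local nonsmooth objectives |x - b_i| (whose sum is minimized at the median of the b_i), noisy function observations, two-point Kiefer--Wolfowitz-style randomized differences in place of subgradients, and a jointly connected time-varying network given by two alternating doubly stochastic weight matrices. All names, parameters, and step-size choices are illustrative assumptions.

```python
import numpy as np

def rd_estimate(f_noisy, x, c, rng):
    # Two-point randomized-difference estimate (Kiefer--Wolfowitz style):
    # perturb x by +/- c*delta with a Rademacher delta and difference the
    # two noisy observations of the local objective.
    delta = rng.choice([-1.0, 1.0])
    return (f_noisy(x + c * delta) - f_noisy(x - c * delta)) / (2.0 * c) * delta

def run(seed=0, T=20000, noise_std=0.1):
    rng = np.random.default_rng(seed)
    b = np.array([0.0, 1.0, 2.0])   # local data; sum_i |x - b_i| is minimized at 1.0
    # Two doubly stochastic weight matrices whose union graph is connected,
    # alternated over time to mimic a time-varying network.
    W0 = np.array([[0.5, 0.5, 0.0],
                   [0.5, 0.5, 0.0],
                   [0.0, 0.0, 1.0]])
    W1 = np.array([[1.0, 0.0, 0.0],
                   [0.0, 0.5, 0.5],
                   [0.0, 0.5, 0.5]])
    x = b.copy()                    # each agent starts from its own data point
    avg, n_avg = np.zeros(3), 0
    for t in range(T):
        W = W0 if t % 2 == 0 else W1
        a_t = 1.0 / (t + 1)             # step sizes: sum a_t = inf, sum a_t^2 < inf
        c_t = 1.0 / (t + 1) ** 0.25     # shrinking randomized-difference span
        mixed = W @ x                   # consensus step over the current graph
        for i in range(3):
            # Noisy zeroth-order observation of the local objective |x - b_i|.
            f_noisy = lambda z, bi=b[i]: abs(z - bi) + noise_std * rng.standard_normal()
            g = rd_estimate(f_noisy, mixed[i], c_t, rng)
            # Descent step followed by projection onto the local constraint set [-10, 10].
            x[i] = np.clip(mixed[i] - a_t * g, -10.0, 10.0)
        if t >= T // 2:                 # average the tail iterates
            avg += x
            n_avg += 1
    return avg / n_avg

est = run()
```

With these assumed parameters, the tail-averaged agent estimates cluster near the global minimizer 1.0, and the spread across agents shrinks, reflecting the consensus and minimization properties the paper establishes with probability one.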
- Subject
- distributed stochastic optimization; gradient-/subgradient-free algorithm; nonsmoothness; randomized differences
- Identifier
- http://hdl.handle.net/1959.13/1462299
- Identifier
- uon:46430
- Identifier
- ISSN:0363-0129
- Language
- eng
- Reviewed